Web Survey Bibliography
Even if respondents do not ask for help and advice when answering a survey question, they might misunderstand its intended meaning. Providing definitions of key terms may improve validity (Conrad & Schober 2000; Peytchev et al. 2010) and is, apart from that, the only way of helping respondents in self-administered surveys. In web surveys, however, plain written text is not always noticed, and participants seldom use definitions even when they are just one or two clicks away (Conrad et al. 2006). In addition to definitions of the concept measured by a survey question, written instructions on the formatting of a response are important in survey settings where no interviewer is present. Studies on this issue mainly deal with the presentation of answer elements (e.g., Christian, Dillman & Smith 2007). With the advent of Web 2.0 technologies, however, researchers can use dynamic tooltip elements to convey crucial information in addition to the question wording. The aim of such a strategy is to increase the proportion of respondents who actually take notice of the definition or instruction when answering a survey question, which in turn should increase response quality.

In 2011 we conducted two field-experimental studies among students and employees of the Technical University of Darmstadt as well as among the general public (N1 = 3,268; N2 = 1,000). In the respective questionnaires, respondents answered various questions with and without definitions of key terms and with and without instructions on how to fill in the answer. In a between-subjects design, respondents were randomly assigned to an experimental condition that used tooltip technology to convey definitions or instructions. Control versions either used a static HTML design or provided no definition or instruction at all.
To assess the effects of the tooltip technology on respondents, we compare response distributions to determine whether respondents noticed the definition or instruction and whether they considered this additional information when answering the question. Results indicate that instructions conveyed via tooltips are indeed considered by respondents: more than half of the respondents exposed to the tooltip version activated the tooltip for three or more seconds, indicating that they had enough time to actually read it. Moreover, compared to the version without instructions and the version with static instructions, more respondents in the tooltip version answered the questions in the formally correct way.
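The "three or more seconds" criterion above is a paradata measure of tooltip exposure time. A minimal sketch of how such exposure time can be accumulated client-side is shown below; the names (`TooltipTimer`, `THRESHOLD_MS`) and the event wiring are illustrative assumptions, not the instrument actually used in the studies.

```javascript
// Sketch: accumulate tooltip exposure time across repeated activations
// and flag respondents whose total exposure suggests they read the text.
const THRESHOLD_MS = 3000; // "3 or more seconds" reading threshold

class TooltipTimer {
  constructor() {
    this.totalMs = 0;    // accumulated exposure time in milliseconds
    this.openedAt = null; // timestamp of the current activation, if any
  }
  open(now) {
    // Call on tooltip activation (e.g., a mouseenter handler).
    if (this.openedAt === null) this.openedAt = now;
  }
  close(now) {
    // Call on tooltip dismissal (e.g., a mouseleave handler).
    if (this.openedAt !== null) {
      this.totalMs += now - this.openedAt;
      this.openedAt = null;
    }
  }
  likelyRead() {
    // Crude proxy: enough cumulative exposure time to read the text.
    return this.totalMs >= THRESHOLD_MS;
  }
}

// Example: two activations of 1.0 s and 2.5 s total 3.5 s.
const timer = new TooltipTimer();
timer.open(0);    timer.close(1000);
timer.open(5000); timer.close(7500);
console.log(timer.totalMs, timer.likelyRead()); // 3500 true
```

In a real survey page the timestamps would come from `Date.now()` inside `mouseenter`/`mouseleave` handlers on the tooltip trigger, and `totalMs` would be submitted alongside the response as paradata.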